Minimax lower bounds for function estimation on graphs
Authors
Abstract
In recent years there has been substantial interest in high-dimensional estimation and prediction problems on large graphs. These can in many cases be seen as high-dimensional or nonparametric regression or classification problems in which the goal is to learn a “smooth” function on a given graph. Various methods have been proposed to deal with such problems, motivated by a variety of applications. Any sensible method employs some form of regularisation that takes the geometry of the graph into account. Examples of methods that have been considered include penalised least squares regression using a Laplacian-based penalty (e.g. Ando and Zhang (2007); Belkin et al. (2004); Kolaczyk (2009); Smola and Kondor (2003); Zhu and Hastie (2005)), penalisation using the total variation norm (e.g. Sadhanala et al. (2016)) and Bayesian regularisation ...
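The Laplacian-penalised least squares regression mentioned above can be sketched as follows. This is a minimal illustration, not the implementation of any of the cited works: it assumes the combinatorial Laplacian L = D − A and the objective ‖y − f‖² + λ fᵀLf, whose minimiser has the closed form f̂ = (I + λL)⁻¹y; the function name `laplacian_smooth` and the toy path graph are ours.

```python
import numpy as np

def laplacian_smooth(y, adjacency, lam=1.0):
    """Laplacian-regularised estimate of a noisy signal y observed on a graph.

    Minimises ||y - f||^2 + lam * f^T L f, which has the closed-form
    solution f = (I + lam * L)^{-1} y.
    """
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian D - A
    n = len(y)
    return np.linalg.solve(np.eye(n) + lam * L, y)

# Example: a path graph 0-1-2-3 carrying a noisy step signal.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
y = np.array([0.1, -0.2, 1.1, 0.9])
f_hat = laplacian_smooth(y, A, lam=2.0)
```

Larger `lam` pulls the estimate towards functions that vary little across edges of the graph, which is the sense in which the penalty encodes "smoothness" with respect to the graph geometry.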
Similar resources
Minimax Estimator of a Lower Bounded Parameter of a Discrete Distribution under a Squared Log Error Loss Function
The problem of estimating the parameter θ, when it is restricted to a lower-bounded interval, in a class of discrete distributions including the Binomial, Negative Binomial and discrete Weibull, is considered. We give necessary and sufficient conditions under which the Bayes estimator of θ with respect to a two-point boundary-supported prior is minimax under the squared log error loss function...
On the Minimax Optimality of Block Thresholded Wavelets Estimators for ?-Mixing Process
We propose a wavelet-based regression function estimator for the estimation of the regression function for a sequence of ?-mixing random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator based on block thresholding are investigated. It is found that the estimators achieve optimal minimax convergence rates over a large class...
Minimax Lower Bounds
Minimax Lower Bounds. Adityanand Guntuboyina, 2011. This thesis deals with lower bounds for the minimax risk in general decision-theoretic problems. Such bounds are useful for assessing the quality of decision rules. After providing a unified treatment of existing techniques, we prove new lower bounds which involve f-divergences, a general class of dissimilarity measures between probability measures...
On Bayes Risk Lower Bounds
This paper provides a general technique for lower bounding the Bayes risk of statistical estimation, applicable to arbitrary loss functions and arbitrary prior distributions. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk, but also characterizes the fundamental limit of any estimator given the prior knowledge. Our bounds are based on the notion of f-inform...
Estimation of Lower Bounded Scale Parameter of Rescaled F-distribution under Entropy Loss Function
We consider the problem of estimating the scale parameter β of a rescaled F-distribution when β has a lower-bounded constraint of the form β ≥ a, under the entropy loss function. An admissible minimax estimator of the scale parameter β, which is the pointwise limit of a sequence of Bayes estimators, is given. Also in the class of truncated linear estimators, the admissible estim...
Truncated Linear Minimax Estimator of a Power of the Scale Parameter in a Lower-Bounded Parameter Space
Minimax estimation problems with restricted parameter spaces have attracted increasing interest within the last two decades. Some authors derived minimax and admissible estimators of bounded parameters under squared error loss and scale-invariant squared error loss. In some truncated estimation problems the most natural estimator to be considered is the truncated version of a classic...
Publication date: 2018